RBF Neural Networks for Function Approximation in Dynamic Modelling

Authors

Abstract


Similar Articles

A Dynamic Parameter Tuning Algorithm for RBF Neural Networks

The objective of this thesis is to present a methodology for fine-tuning the parameters of radial basis function (RBF) neural networks, thus improving their performance. Three main parameters affect the performance of an RBF network. They are the centers and widths of the RBF nodes and the weights associated with each node. A gridded center and orthogonal search algorithm have been used to init...

Full text
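The entry above names the three parameter groups of an RBF network: centers, widths, and output weights. As a minimal sketch of how these groups fit together, the NumPy code below builds a Gaussian RBF network with randomly chosen centers, a distance-based width heuristic, and least-squares output weights. This is an illustrative baseline only, not the gridded-center and orthogonal-search procedure described in the thesis.

```python
# Minimal Gaussian RBF network sketch (NumPy only). The centers, widths,
# and output weights correspond to the three parameter groups named in the
# abstract; the initialisation below is an illustrative assumption.
import numpy as np

def rbf_design_matrix(X, centers, widths):
    """Gaussian RBF activations: phi_ij = exp(-||x_i - c_j||^2 / (2 w_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths**2))

def fit_rbf(X, y, n_centers=10, rng=None):
    rng = np.random.default_rng(rng)
    # Centers: a random subset of the training inputs (placeholder for a
    # gridded or orthogonal-search selection).
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    # Widths: mean distance to the other centers, a common heuristic.
    dists = np.linalg.norm(centers[:, None] - centers[None, :], axis=2)
    widths = dists.sum(axis=1) / (n_centers - 1)
    # Output weights: linear least squares on the RBF activations.
    Phi = rbf_design_matrix(X, centers, widths)
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, widths, weights

def predict_rbf(X, centers, widths, weights):
    return rbf_design_matrix(X, centers, widths) @ weights
```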

Efficient Parameters Selection for CNTFET Modelling Using Artificial Neural Networks

In this article, different types of artificial neural networks (ANN) were used for CNTFET (carbon nanotube field-effect transistor) simulation. The CNTFET is one of the most likely alternatives to the silicon transistor due to its excellent electronic properties. In determining the accurate output drain current of the CNTFET, the elapsed time and accuracy of different simulation methods were compared. The training data for...

Full text
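A hedged illustration of the surrogate-modelling idea in the abstract above: a small feed-forward network is fitted to map bias voltages to drain current. The toy I-V surface and the scikit-learn MLPRegressor used here are assumptions standing in for the article's CNTFET simulation data and the network types it compares.

```python
# Sketch: train a feed-forward ANN as a surrogate mapping bias voltages to
# drain current. The synthetic I-V surface below is a placeholder, not
# CNTFET data from the article.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Inputs: gate-source and drain-source voltages; output: drain current.
V = rng.uniform(0.0, 1.0, size=(2000, 2))                     # [V_gs, V_ds]
I_d = np.tanh(5 * V[:, 0] - 2) * (1 - np.exp(-4 * V[:, 1]))   # toy I-V surface

model = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                     max_iter=2000, random_state=0)
model.fit(V, I_d)

# Accuracy of the surrogate on held-out points (stands in for the
# time/accuracy comparison mentioned in the abstract).
V_test = rng.uniform(0.0, 1.0, size=(200, 2))
I_true = np.tanh(5 * V_test[:, 0] - 2) * (1 - np.exp(-4 * V_test[:, 1]))
print("RMS error:", np.sqrt(np.mean((model.predict(V_test) - I_true) ** 2)))
```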

Comparison between Beta Wavelet Neural Networks, RBF Neural Networks and Polynomial Approximation for 1-D, 2-D Function Approximation

This paper proposes a comparison between wavelet neural networks (WNN), RBF neural networks and polynomial approximation in terms of 1-D and 2-D function approximation. We present a novel wavelet neural network, based on Beta wavelets, for 1-D and 2-D function approximation. Our purpose is to approximate an unknown function f: R^n → R from scattered samples (x_i, y_i = f(x_i)), i = 1, …, n, where first, w...

Full text
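As a concrete 1-D instance of the comparison described above, the sketch below fits the same scattered samples (x_i, y_i = f(x_i)) with a Gaussian RBF interpolant and with a global polynomial, then reports their test errors. The target function, kernel, and polynomial degree are arbitrary choices for illustration, not the paper's Beta-wavelet construction.

```python
# Compare an RBF fit and a polynomial fit to scattered samples of an
# "unknown" 1-D function. The target f and the hyperparameters are
# illustrative assumptions.
import numpy as np
from scipy.interpolate import RBFInterpolator

def f(x):
    return np.sin(3 * x) + 0.5 * x**2   # assumed target function

rng = np.random.default_rng(0)
x_train = rng.uniform(-2, 2, size=40)
y_train = f(x_train)

# RBF approximation (Gaussian kernel with a hand-picked shape parameter).
rbf = RBFInterpolator(x_train[:, None], y_train, kernel="gaussian", epsilon=1.0)

# Global polynomial approximation of modest degree for comparison.
poly = np.polynomial.Polynomial.fit(x_train, y_train, deg=7)

x_test = np.linspace(-2, 2, 200)
rbf_err = np.max(np.abs(rbf(x_test[:, None]) - f(x_test)))
poly_err = np.max(np.abs(poly(x_test) - f(x_test)))
print(f"max |error|  RBF: {rbf_err:.3e}   polynomial: {poly_err:.3e}")
```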

Investigation of Neural Networks for Function Approximation

In this work, several widely used neural networks are applied to the function approximation problem of modelling known landscapes. The performance of the various neural networks is analyzed and validated on well-known benchmark problems as target functions, such as the Sphere, Rastrigin, and Griewank functions. The experimental results show that, among the three neural networks tested, Radial Basis F...

Full text
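For reference, the three benchmark landscapes named above have standard closed forms; the snippet below writes them as plain NumPy target functions and draws scattered samples from one of them, the kind of training set such an approximation study would use. The sampling range and sample count are illustrative assumptions.

```python
# Standard definitions of the Sphere, Rastrigin, and Griewank benchmark
# functions, usable as target functions for function-approximation networks.
import numpy as np

def sphere(x):
    x = np.asarray(x, dtype=float)
    return np.sum(x**2)

def rastrigin(x):
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def griewank(x):
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1 + np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i)))

# Example: scattered samples of the 2-D Rastrigin landscape as training data.
rng = np.random.default_rng(0)
X = rng.uniform(-5.12, 5.12, size=(500, 2))
y = np.apply_along_axis(rastrigin, 1, X)
```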

Why Deep Neural Networks for Function Approximation?

Recently there has been much interest in understanding why deep neural networks are preferred to shallow networks. We show that, for a large class of piecewise smooth functions, the number of neurons needed by a shallow network to approximate a function is exponentially larger than the corresponding number of neurons needed by a deep network for a given degree of function approximation. First, ...

Full text


Journal

Journal title: Journal of Konbin

Year: 2008

ISSN: 2083-4608, 1895-8281

DOI: 10.2478/v10040-008-0115-6